Written by: student

Sheet1

Student ID    Exam score
89103692      29
89106619      34
89105411      19
89105636      29
89102171
89101345      22
89100951      19
89106105      28
89102963      47
89108803      26
89110439      23
89102714
89109938      7
89110388
89104889      31
89110417      37
89102125      33
89102906      19
89101742      39
89105206      30
89105466      43
89101601      33
89105896      10
89104364      31
89102471      35
89110271      25
87106348      32
89102103      28
89102811      23
89103873      21
89103249      29
89105358      26
89102403      19
89108739      26
89101645      40
89104245      43
89103584      31
89105766
87105908      41
89102566      33
89011192      1
89104031      18
89106387      30
89103913      32
89107897      33
89102928      45
89108077      10
89103721      32
89107531      11
87100237      15


:: Views: 1003 | Post rating: 139 | Number of raters: 41 | Total score: 41
Published: 9 Aban 1389 | Comments
Written by: Hossein Soltani

Before 1900

People have been using mechanical devices to aid calculation for thousands of years. For example, the abacus probably existed in Babylonia (present-day Iraq) about 3000 B.C.E. The ancient Greeks developed some very sophisticated analog computers. In 1901, an ancient Greek shipwreck was discovered off the island of Antikythera. Inside was a salt-encrusted device (now called the Antikythera mechanism) that consisted of rusted metal gears and pointers. When this c. 80 B.C.E. device was reconstructed, it produced a mechanism for predicting the motions of the stars and planets. (More Antikythera info here.)

John Napier (1550-1617), the Scottish inventor of logarithms, invented Napier's rods (sometimes called "Napier's bones") c. 1610 to simplify the task of multiplication.

In 1641 the French mathematician and philosopher Blaise Pascal (1623-1662) built a mechanical adding machine. Similar work was done by Gottfried Wilhelm Leibniz (1646-1716). Leibniz also advocated use of the binary system for doing calculations.

Recently it was discovered that Wilhelm Schickard (1592-1635), a graduate of the University of Tübingen (Germany), constructed such a device in 1623-4, before both Pascal and Leibniz. A brief description of the device is contained in two letters to Johannes Kepler. Unfortunately, at least one copy of the machine burned up in a fire, and Schickard himself died of bubonic plague in 1635, during the Thirty Years' War.

Joseph-Marie Jacquard (1752-1834) invented a loom that could weave complicated patterns described by holes in punched cards. Charles Babbage (1791-1871) worked on two mechanical devices: the Difference Engine and the far more ambitious Analytical Engine (a precursor of the modern digital computer), but neither worked satisfactorily. (Babbage was a bit of an eccentric -- one biographer calls him an "irascible genius" -- and was probably the model for Daniel Doyce in Charles Dickens' novel, Little Dorrit. A little-known fact about Babbage is that he invented the science of dendrochronology -- tree-ring dating -- but never pursued his invention. In his later years, Babbage devoted much of his time to the persecution of street musicians (organ-grinders).) The Difference Engine can be viewed nowadays in the Science Museum in London, England.

One of Babbage's friends, Ada Augusta Byron, Countess of Lovelace (1815-1852), sometimes is called the "first programmer" because of a report she wrote on Babbage's machine. (The programming language Ada was named for her.)

William Stanley Jevons (1835-1882), a British economist and logician, built a machine in 1869 to solve logic problems. It was "the first such machine with sufficient power to solve a complicated problem faster than the problem could be solved without the machine's aid." (Gardner) It is now in the Oxford Museum of the History of Science.

Herman Hollerith (1860-1929) invented the modern punched card for use in a machine he designed to help tabulate the 1890 census.

 


1900 - 1939: The Rise of Mathematics

Work on calculating machines continued. Some special-purpose calculating machines were built. For example, in 1919, E. O. Carissan (1880-1925), a lieutenant in the French infantry, designed and had built a marvelous mechanical device for factoring integers and testing them for primality. The Spaniard Leonardo Torres y Quevedo (1852-1936) built some electromechanical calculating devices, including one that played simple chess endgames.

In 1928, the German mathematician David Hilbert (1862-1943) addressed the International Congress of Mathematicians. He posed three questions: (1) Is mathematics complete; i.e. can every mathematical statement be either proved or disproved? (2) Is mathematics consistent, that is, is it true that statements such as "0 = 1" cannot be proved by valid methods? (3) Is mathematics decidable, that is, is there a mechanical method that can be applied to any mathematical assertion and (at least in principle) will eventually tell whether that assertion is true or not? This last question was called the Entscheidungsproblem.

In 1931, Kurt Gödel (1906-1978) answered two of Hilbert's questions. He showed that every sufficiently powerful formal system is either inconsistent or incomplete. Also, if an axiom system is consistent, this consistency cannot be proved within itself. The third question remained open, with 'provable' substituted for 'true'.

In 1936, Alan Turing (1912-1954) provided a solution to Hilbert's Entscheidungsproblem by constructing a formal model of a computer -- the Turing machine -- and showing that there were problems such a machine could not solve. One such problem is the so-called "halting problem": given a Pascal program, does it halt on all inputs?
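Turing's model is simple enough to simulate in a few lines. The sketch below is a minimal interpreter for one-tape machines; the rule format is our own convention, not Turing's notation, and the sample machine is an illustrative one that inverts a binary string.

```python
def run_turing_machine(rules, tape, state="start", blank="_"):
    """Simulate a one-tape Turing machine.

    rules maps (state, symbol) -> (new_symbol, move, new_state),
    where move is -1 (left) or +1 (right). The machine halts when
    no rule matches the current (state, symbol) pair.
    """
    tape = list(tape)
    pos = 0
    while (state, tape[pos]) in rules:
        symbol, move, state = rules[(state, tape[pos])]
        tape[pos] = symbol
        pos += move
        if pos == len(tape):          # extend the tape with blanks on demand
            tape.append(blank)
    return "".join(tape).rstrip(blank)

# A machine that inverts a binary string: 0 -> 1, 1 -> 0.
invert = {
    ("start", "0"): ("1", +1, "start"),
    ("start", "1"): ("0", +1, "start"),
}
print(run_turing_machine(invert, "1011_"))  # -> 0100
```

The halting problem asks whether a general procedure could predict, for every such rule table and tape, whether this loop ever terminates; Turing showed no such procedure exists.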

 


1940's: Wartime brings the birth of the electronic digital computer

The calculations required for ballistics during World War II spurred the development of the general-purpose electronic digital computer. At Harvard, Howard H. Aiken (1900-1973) built the Mark I electromechanical computer in 1944, with the assistance of IBM.

Military code-breaking also led to computational projects. Alan Turing was involved in the breaking of the code behind the German machine, the Enigma, at Bletchley Park in England. The British built a computing device, the Colossus, to assist with code-breaking.

At Iowa State University in 1939, John Vincent Atanasoff (1904-1995) and Clifford Berry designed and built an electronic computer for solving systems of linear equations, but it never worked properly.

Atanasoff discussed his invention with John William Mauchly (1907-1980), who later, with J. Presper Eckert, Jr. (1919-1995), designed and built the ENIAC, a general-purpose electronic computer originally intended for artillery calculations. Exactly what ideas Mauchly got from Atanasoff is not completely clear, and whether Atanasoff or Mauchly and Eckert deserve credit as the originators of the electronic digital computer was the subject of legal battles and ongoing historical debate. The ENIAC was built at the Moore School at the University of Pennsylvania, and was finished in 1946.

In 1944, Mauchly, Eckert, and John von Neumann (1903-1957) were already at work designing a stored-program electronic computer, the EDVAC. Von Neumann's report, "First Draft of a Report on the EDVAC", was very influential and contains many of the ideas still used in most modern digital computers, including a mergesort routine. Eckert and Mauchly went on to build UNIVAC.
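The mergesort idea in von Neumann's report rests on one operation, merging two already-sorted runs. The sketch below is a modern Python rendering of the general method, not the EDVAC routine itself.

```python
def merge_sort(items):
    """Recursively split, sort each half, then merge the sorted halves."""
    if len(items) <= 1:
        return items
    mid = len(items) // 2
    left, right = merge_sort(items[:mid]), merge_sort(items[mid:])
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):   # take the smaller head each step
        if left[i] <= right[j]:
            merged.append(left[i]); i += 1
        else:
            merged.append(right[j]); j += 1
    return merged + left[i:] + right[j:]       # one side may have leftovers

print(merge_sort([38, 27, 43, 3, 9, 82, 10]))  # -> [3, 9, 10, 27, 38, 43, 82]
```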

Meanwhile, in Germany, Konrad Zuse (1910-1995) built the first operational, general-purpose, program-controlled calculator, the Z3, in 1941. More information about Zuse can be found here.

In 1945, Vannevar Bush published a surprisingly prescient article in the Atlantic Monthly about the ways information processing would affect the society of the future. (Another copy of the Bush article appears here.)

Maurice Wilkes (b. 1913), working in Cambridge, England, built the EDSAC, a computer based on the EDVAC. F. C. Williams (b. 1911) and others at Manchester University built the Manchester Mark I, one version of which was working as early as June 1948. This machine is sometimes called the first stored-program digital computer.

The invention of the transistor in 1947 by John Bardeen (1908-1991), Walter Brattain (1902-1987), and William Shockley (1910-1989) transformed the computer and made possible the microprocessor revolution. For this discovery they won the 1956 Nobel Prize in physics. (Shockley later became notorious for his racist views.)

Jay Forrester (b. 1918) invented magnetic core memory c. 1949. More about Forrester here.

 


1950's

Grace Murray Hopper (1906-1992) invented the notion of a compiler, at Remington Rand, in 1951. Earlier, in 1947, Hopper found the first computer "bug" -- a real one -- a moth that had gotten into the Harvard Mark II. (Actually, the use of "bug" to mean defect goes back to at least 1889.)

John Backus and others developed the first FORTRAN compiler in April 1957. LISP, a list-processing language for artificial intelligence programming, was invented by John McCarthy about 1958. Alan Perlis, John Backus, Peter Naur and others developed Algol.

In hardware, Jack Kilby (Texas Instruments) and Robert Noyce (Fairchild Semiconductor) invented the integrated circuit in 1959.

Edsger Dijkstra invented an efficient algorithm for shortest paths in graphs as a demonstration of the ARMAC computer in 1956. He also invented an efficient algorithm for the minimum spanning tree in order to minimize the wiring needed for the X1 computer. (Dijkstra is famous for his caustic, opinionated memos. For example, see his opinions of some programming languages).
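Dijkstra's shortest-path algorithm can be sketched briefly. The heap-based variant below postdates his 1956 formulation but implements the same idea; the graph and its weights are made-up example data.

```python
import heapq

def dijkstra(graph, source):
    """Shortest-path distances from source in a graph with non-negative
    edge weights. graph maps node -> list of (neighbor, weight) pairs."""
    dist = {source: 0}
    heap = [(0, source)]
    while heap:
        d, node = heapq.heappop(heap)
        if d > dist.get(node, float("inf")):
            continue                      # stale queue entry, skip it
        for neighbor, weight in graph.get(node, []):
            new_d = d + weight
            if new_d < dist.get(neighbor, float("inf")):
                dist[neighbor] = new_d
                heapq.heappush(heap, (new_d, neighbor))
    return dist

graph = {"a": [("b", 7), ("c", 9), ("f", 14)],
         "b": [("c", 10), ("d", 15)],
         "c": [("d", 11), ("f", 2)],
         "d": [("e", 6)],
         "f": [("e", 9)]}
print(dijkstra(graph, "a"))
```

On this example the distance from "a" to "e" comes out to 20, reached via "c" and "d".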

In a famous paper that appeared in the journal Mind in 1950, Alan Turing introduced the Turing Test, one of the first efforts in the field of artificial intelligence. He proposed a definition of "thinking" or "consciousness" using a game: a tester would have to decide, on the basis of written conversation, whether the entity in the next room responding to the tester's queries was a human or a computer. If this distinction could not be made, then it could be fairly said that the computer was "thinking".

In 1952, Alan Turing was arrested for "gross indecency" after a burglary led to the discovery of his affair with Arnold Murray. Overt homosexuality was taboo in 1950's England, and Turing was forced to take estrogen "treatments" which rendered him impotent and caused him to grow breasts. On June 7, 1954, despondent over his situation, Turing committed suicide by eating an apple laced with cyanide.

 


1960's

In the 1960's, computer science came into its own as a discipline. In fact, the term was coined by George Forsythe, a numerical analyst. The first computer science department was formed at Purdue University in 1962. The first person to receive a Ph. D. from a computer science department was Richard Wexelblat, at the University of Pennsylvania, in December 1965.

Operating systems saw major advances. Fred Brooks at IBM designed System/360, a line of different computers with the same architecture and instruction set, from small machine to top-of-the-line. Edsger Dijkstra at Eindhoven designed the THE multiprogramming system.

At the end of the decade, ARPAnet, a precursor to today's Internet, began to be constructed.

Many new programming languages were invented, such as BASIC (developed c. 1964 by John Kemeny (1926-1992) and Thomas Kurtz (b. 1928)).

The 1960's also saw the rise of automata theory and the theory of formal languages. Big names here include Noam Chomsky and Michael Rabin. Chomsky later became well-known for his theory that language is "hard-wired" in human brains, and for his criticism of American foreign policy.

Proving correctness of programs using formal methods also began to be more important in this decade. The work of Tony Hoare played an important role. Hoare also invented Quicksort.
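Hoare's Quicksort idea, partition around a pivot and recurse on each side, fits in a few lines. This is a simple out-of-place sketch, not Hoare's original in-place partition scheme.

```python
def quicksort(items):
    """Pick a pivot, split the rest into smaller and larger, recurse."""
    if len(items) <= 1:
        return items
    pivot, rest = items[0], items[1:]
    smaller = [x for x in rest if x < pivot]
    larger = [x for x in rest if x >= pivot]
    return quicksort(smaller) + [pivot] + quicksort(larger)

print(quicksort([3, 1, 4, 1, 5, 9, 2, 6]))  # -> [1, 1, 2, 3, 4, 5, 6, 9]
```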

Douglas C. Engelbart invented the computer mouse c. 1968, at SRI.

Ted Hoff (b. 1937) and Federico Faggin at Intel designed the first microprocessor (computer on a chip) in 1969-1971.

A rigorous mathematical basis for the analysis of algorithms began with the work of Donald Knuth (b. 1938), author of the three-volume treatise entitled The Art of Computer Programming.

 


1970's

The theory of databases saw major advances with the work of Edgar F. Codd on relational databases. Codd won the Turing award in 1981.

Unix, a very influential operating system, was developed at Bell Laboratories by Ken Thompson (b. 1943) and Dennis Ritchie (b. 1941). Brian Kernighan and Ritchie together developed C, an influential programming language.

Other new programming languages, such as Pascal (invented by Niklaus Wirth) and Ada (developed by a team led by Jean Ichbiah), arose.

The first RISC architecture was begun by John Cocke in 1975, at the Thomas J. Watson Laboratories of IBM. Similar projects started at Berkeley and Stanford around this time.

The 1970's also saw the rise of the supercomputer. Seymour Cray (b. 1925) designed the CRAY-1, which was first shipped in March 1976. It could perform 160 million operations per second. The Cray X-MP came out in 1982. Cray Research was later taken over by Silicon Graphics.

There were also major advances in algorithms and computational complexity. In 1971, Steve Cook published his seminal paper on NP-completeness, and shortly thereafter, Richard Karp showed that many natural combinatorial problems were NP-complete. Whit Diffie and Martin Hellman published a paper that introduced the theory of public-key cryptography, and a public-key cryptosystem known as RSA was invented by Ronald Rivest, Adi Shamir, and Leonard Adleman.
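Diffie and Hellman's key exchange can be shown with toy numbers: both parties end up with the same secret without ever transmitting it. The parameters below are tiny and completely insecure, chosen for illustration only.

```python
def diffie_hellman_demo(p=23, g=5, a=6, b=15):
    """Toy Diffie-Hellman key exchange (insecure, illustration only).

    Each party keeps a secret exponent (a, b), publishes g**secret mod p,
    and raises the other party's public value to its own secret.
    Both arrive at g**(a*b) mod p without that key ever being sent."""
    A = pow(g, a, p)          # Alice publishes A
    B = pow(g, b, p)          # Bob publishes B
    key_alice = pow(B, a, p)  # Alice computes B**a mod p
    key_bob = pow(A, b, p)    # Bob computes A**b mod p
    assert key_alice == key_bob
    return key_alice

print(diffie_hellman_demo())  # -> 2
```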

In 1979, three graduate students in North Carolina developed a distributed news server which eventually became Usenet.


1980's

This decade also saw the rise of the personal computer, thanks to Steve Wozniak and Steve Jobs, founders of Apple Computer.

The first computer viruses are developed c. 1981. The term was coined by Leonard Adleman, now at the University of Southern California.

In 1981, the first truly successful portable computer was marketed, the Osborne I. In 1984, Apple first marketed the Macintosh computer.

In 1987, the US National Science Foundation started NSFnet, precursor to part of today's Internet.

 


1990's and Beyond

Parallel computers continue to be developed.

Biological computing, with the recent work of Len Adleman on doing computations via DNA, has great promise. The Human Genome Project is attempting to sequence all the DNA in a single human being.

Quantum computing gets a boost with the discovery by Peter Shor that integer factorization can be performed efficiently on a (theoretical) quantum computer.

The "Information Superhighway" links more and more computers worldwide.

Computers get smaller and smaller; nano-technology is born.



:: Views: 1713 | Post rating: 121 | Number of raters: 33 | Total score: 33
Published: 5 Aban 1389 | Comments
Written by: Hossein Soltani

Graduate Research, Writing, and Careers in Computer Science

Graduate Student Orientation

You can find the slides from the graduate student orientation here.

You can find the Iowa State University Computer Science Graduate Student Handbook here.

Useful Books on Graduate Studies, Research and Careers in Computer Science

 

You can find a list of extremely useful books on research, scientific ethics, writing, dissertations, and scientific careers that I recommend to all of my graduate students here.

Online Resources on Graduate Study in Computer Science

 

Online Resources on Ethics

Online Resources on Research in Computer Science

Online Resources on Writing, Publication, and Presentation

Grammar and Style

Writing a Thesis

Writing Papers and Abstracts

Presenting Research

Writing Research Proposals

Online Resources for Women in Computer Science

 

Graduate Scholarships and Fellowships

Online Resources on Careers in Computer Science

Online Resources on Life after Graduate School

Humor



:: Views: 1969 | Post rating: 100 | Number of raters: 28 | Total score: 28
Published: 5 Aban 1389 | Comments
Written by: Hossein Soltani

List of important publications in computer science

This is a list of important publications in computer science, organized by field.

There are a number of reasons why a particular publication might be regarded as important:

  • Topic creator - A publication that created a new topic
  • Break through - A publication that changed scientific knowledge significantly
  • Introduction - A publication that is a good introduction or survey of a topic
  • Impact - A publication which had a major impact on the world or on research
  • Latest and greatest - The current most advanced result in a topic

 

Computability

On computable numbers, with an application to the Entscheidungsproblem

  • Alan Turing
  • Proceedings of the London Mathematical Society, Series 2, 42 (1936), pp 230-265. Errata appeared in Series 2, 43 (1937), pp 544-546.
  • Online version

Description: Here it all began: the Turing machine.

Importance: Topic creator, Break through, Impact

 

Computational complexity theory

On the computational complexity of algorithms

Description: This paper gave computational complexity its name and seed.

Importance: Topic creator, Break through, Impact

 

The complexity of theorem proving procedures

  • S. A. Cook
  • Proceedings of the 3rd Annual ACM Symposium on Theory of Computing (1971), pp. 151--158.

Description: This paper introduced the concept of NP-Completeness and proved that SAT is NP-Complete.

Importance: Topic creator, Break through, Impact

Reducibility among combinatorial problems

  • R. M. Karp
  • In R. E. Miller and J. W. Thatcher, editors, Complexity of Computer Computations, pages 85-103. Plenum Press, New York, NY, 1972.

Description: This paper showed that 21 different problems are NP-Complete and showed the importance of the concept.

Importance: Impact

 

Computers and Intractability: A Guide to the Theory of NP-Completeness

Description: The main importance of this book is its extensive list of more than 300 NP-Complete problems, which became a common reference and definition. It is notable that such an extensive list was compiled only a few years after the concept was defined.

Importance: Introduction, Impact, Latest and greatest

 

The Knowledge Complexity of Interactive Proof Systems

Description: This paper introduced the concept of zero knowledge.

Importance: Topic creator, Break through

 

How to Construct Random Functions

Description: This paper showed that the existence of one way functions leads to computational randomness.

Importance: Topic creator, Break through, Latest and greatest

 

Computational Complexity

Description: This book provides a very good introduction to computational complexity.

Importance: Introduction

 

Algorithms

A machine program for theorem proving

Description: The DLL algorithm. The basic algorithm for SAT and other NP-Complete problems.

Importance: Break through, Impact
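The backbone of the DLL procedure (now usually called DPLL), branch on a variable and simplify the clause set, can be sketched briefly. This bare-bones version omits the unit-propagation and pure-literal rules of the full algorithm; the clause encoding is our own convention.

```python
def dpll(clauses, assignment=None):
    """Minimal DPLL SAT sketch: pick a variable, try both truth values,
    and simplify the clause set after each choice.

    Clauses are lists of nonzero ints; -n means "not variable n".
    Returns a satisfying assignment dict, or None if unsatisfiable."""
    if assignment is None:
        assignment = {}
    if not clauses:
        return assignment            # every clause satisfied
    if any(len(c) == 0 for c in clauses):
        return None                  # an empty clause cannot be satisfied
    var = abs(clauses[0][0])
    for value in (True, False):
        lit = var if value else -var
        # drop satisfied clauses, remove the now-false opposite literal
        simplified = [[l for l in c if l != -lit]
                      for c in clauses if lit not in c]
        result = dpll(simplified, {**assignment, var: value})
        if result is not None:
            return result
    return None

# (x1 or x2) and (not x1 or x3) and (not x2 or not x3)
model = dpll([[1, 2], [-1, 3], [-2, -3]])
```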

 

Optimization by simulated annealing

Description: A very common heuristic for NP-Complete problems.

Importance: Impact

 

The Art of Computer Programming

Description: This set of textbooks has long been among the most popular algorithms books. The algorithms are written in the MIX assembly language, which makes them very precise but not very readable...

Importance: Impact

 

Introduction to algorithms

Description: As its name indicates, this textbook is a very good introduction to algorithms. It became so popular that it is almost the de facto standard for teaching basic algorithms. The only problem with this 1028-page book is that it might cause severe pain when falling on your foot.

Importance: Introduction, Impact

 

 

Communication theory

A mathematical theory of communication

Description: This paper created communication theory and information theory.

Importance: Topic creator, Break through, Introduction, Impact
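The paper's central quantity, the entropy of a source, measures the average information per symbol in bits and is easy to compute. A small sketch:

```python
from math import log2

def entropy(probabilities):
    """Shannon entropy H = -sum(p * log2(p)), in bits per symbol.
    Zero-probability symbols contribute nothing."""
    return -sum(p * log2(p) for p in probabilities if p > 0)

print(entropy([0.5, 0.5]))   # a fair coin carries 1 bit per toss
print(entropy([0.25] * 4))   # four equally likely symbols: 2 bits
```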

 

Information theory

A mathematical theory of communication

Description: This paper created communication theory and information theory.

Importance: Topic creator, Break through, Introduction, Impact

 

A Method for the Construction of Minimum Redundancy Codes

  • David A. Huffman
  • Proceedings of the Institute of Radio Engineers, September 1952, Volume 40, Number 9, pp. 1098-1101.

Description: The Huffman coding.

Importance: Impact, Break through
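Huffman's construction, repeatedly merging the two least frequent subtrees so that frequent symbols end up with short codewords, can be sketched as follows. This is a minimal version; tie-breaking among equal frequencies is arbitrary, so only code lengths (not the exact bit patterns) are determined.

```python
import heapq
from collections import Counter

def huffman_code(text):
    """Build a Huffman code for `text`: merge the two least frequent
    subtrees until one tree remains, growing codewords from the root."""
    heap = [[freq, [sym, ""]] for sym, freq in Counter(text).items()]
    heapq.heapify(heap)
    while len(heap) > 1:
        lo = heapq.heappop(heap)
        hi = heapq.heappop(heap)
        for pair in lo[1:]:            # left subtree: prefix every code with 0
            pair[1] = "0" + pair[1]
        for pair in hi[1:]:            # right subtree: prefix with 1
            pair[1] = "1" + pair[1]
        heapq.heappush(heap, [lo[0] + hi[0]] + lo[1:] + hi[1:])
    return dict(heap[0][1:])

codes = huffman_code("abracadabra")
# 'a' occurs 5 times out of 11, so it gets the shortest codeword.
```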

 

A Universal Algorithm for Sequential Data Compression

  • Jacob Ziv
  • Abraham Lempel
  • IEEE Transactions on Information Theory, Vol. 23, No. 3, pp. 337-343.
  • http://citeseer.nj.nec.com/ziv77universal.html

Description: The LZ77 compression algorithm.

Importance: Impact, Break through

 

Elements of Information Theory

Description: A good and popular introduction to information theory.

Importance: Impact, Introduction

 

Databases

 

A Relational Model of Data for Large Shared Data Banks

  • E. F. Codd
  • Communications of the ACM, 13(6):377--387, June 1970

Description: This paper introduced the relational model for databases, which became the dominant database model.

Importance: Topic creator, Break through, Impact

 

The Entity Relationship Model - Towards a Unified View of Data

  • P.P-S. Chen
  • ACM Transactions on Database Systems, Vol. 1, No. 1, March 1976, pp. 9-36

Description: This paper introduced the ERD method of database design.

Importance: Break through, Impact

 

Mining association rules between sets of items in large databases

Description: Association rules, a very common method for data mining.

Importance: Topic creator, Introduction, Impact

 

Cryptography

New directions in cryptography

Description: This paper suggested public key cryptography and invented Diffie-Hellman key exchange.

Importance: Topic creator, Break through, Introduction, Impact, Latest and greatest (A great paper from every perspective...)

 

A Method for Obtaining Digital Signatures and Public Key Cryptosystems

Description: The RSA encryption method. The first public key method.

Importance: Break through, Impact
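Textbook RSA can be demonstrated end to end with tiny primes (completely insecure; for illustration only). The numbers below are the classic small worked example with p = 61, q = 53.

```python
def rsa_demo():
    """Textbook RSA with toy parameters: key setup, encrypt, decrypt."""
    p, q = 61, 53
    n = p * q                 # 3233, the public modulus
    phi = (p - 1) * (q - 1)   # 3120
    e = 17                    # public exponent, coprime to phi
    d = pow(e, -1, phi)       # private exponent: e*d == 1 (mod phi)
    message = 65
    ciphertext = pow(message, e, n)   # encrypt with the public key (e, n)
    decrypted = pow(ciphertext, d, n) # decrypt with the private key (d, n)
    return ciphertext, decrypted

print(rsa_demo())  # -> (2790, 65)
```

Note that `pow(e, -1, phi)` (modular inverse via a negative exponent) requires Python 3.8 or later.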

 

How to Share a Secret

  • Shamir, A.
  • Comm. Assoc. Comput. Mach., vol.22, no.11, pp.612--613 (Nov. 1979)

Description: A safe method for sharing a secret.

Importance: Topic creator, Break through

 

Machine learning

Language identification in the limit

  • E. M. Gold
  • Information and Control, 10:447--474, 1967

Description: This paper created Algorithmic learning theory.

Importance: Topic creator, Break through, Impact

 

On the uniform convergence of relative frequencies of events to their probabilities

Description: Statistical learning theory, statistical uniform convergence and the VC dimension. Importance: Break through, Impact

 

A theory of the learnable

  • Valiant, L.G.
  • Communications of the ACM, 27(11): 1134--1142 (1984)

Description: The Probably approximately correct learning (PAC learning) framework.

Importance: Topic creator, Break through, Impact

 

Compilers

YACC: Yet another compiler-compiler

Description: Yacc is a tool that made compiler writing much easier.

Importance: Impact

 

Compilers: Principles, Techniques and Tools

Description: This book became a classic in compiler writing. It is also known as the Dragon book, after the (red) dragon that appears on its cover.

Importance: Introduction, Impact

 

Software engineering

GOTO Considered Harmful

Description: Don't use goto - the beginning of structured programming.

Importance: Topic creator, Impact

 

No Silver Bullet: Essence and Accidents of Software Engineering

  • F. Brooks
  • Computer, 20(4):10-19, April 1987
  • http://www.virtualschool.edu/mon/SoftwareEngineering/BrooksNoSilverBullet.html

Description: We will keep having problems with software...

Importance: Impact

 

The Cathedral and the Bazaar

Description: Open source methodology.

Importance: Impact

 

Design Patterns: Elements of Reusable Object-Oriented Software

Description: This book was the first to define and list design patterns in computer science.

Importance: Topic creator, Impact

 

Computer networks

Ethernet: Distributed packet switching for local computer networks

Description: The Ethernet protocol.

Importance: Impact, Latest and greatest



:: Views: 1449 | Post rating: 57 | Number of raters: 20 | Total score: 20
Published: 5 Aban 1389 | Comments
Written by: Hossein Soltani

Computer science

In its most general sense, computer science (CS) is the study of computation and information processing, both in hardware and in software. In practice, computer science includes a variety of topics relating to computers, which range from the abstract analysis of algorithms to more concrete subjects like programming languages, software, and computer hardware. As a scientific discipline, it differs significantly from mathematics, programming, software engineering, and computer engineering, although these fields are often confused.

 

Computer science is no more about computers than astronomy is about telescopes
- Edsger Dijkstra

 

Computer science is not as old as physics; it lags by a couple of hundred years. However, this does not mean that there is significantly less on the computer scientist's plate than on the physicist's: younger it may be, but it has had a far more intense upbringing!
- Richard Feynman

The Church-Turing thesis states that all known kinds of general computing devices are essentially equivalent in what they can do, although they vary in time and space efficiency. This thesis is sometimes treated as the fundamental principle of computer science. Most research in computer science has been related to von Neumann computers or Turing machines (computers that do one small, deterministic task at a time), because they resemble most real computers in use today. Computer scientists also study other kinds of machines, some practical (like parallel machines) and some theoretical (like random, oracle and quantum machines).

Computer scientists study what programs can and cannot do (computability and artificial intelligence), how programs should efficiently perform specific tasks (algorithms), how programs should store and retrieve specific kinds of information (data structures), and how programs and people should communicate with each other (user interfaces and programming languages).

Computer science has roots in electrical engineering, mathematics and linguistics. In the last third of the 20th century computer science has become recognized as a distinct discipline and has developed its own methods and terminology.

The first computer science department in the United States was founded at Purdue University in 1962. The University of Cambridge in England, among others, taught CS prior to this; however, at the time CS was seen as a branch of mathematics, not a separate department. Cambridge claims to have the world's oldest taught qualification in computing. Most universities today have specific departments devoted to computer science.

The highest honor in computer science is the Turing Award.

 

Related fields

Computer science is closely related to several other fields. These fields overlap considerably, though important differences exist.

 

  • Information science is the study of data and information, including how to interpret, analyze, store, and retrieve it. Information science started as the foundation of scientific analysis of communication and databases.
  • Software engineering emphasizes analysis, design, and construction of useful software using contemporary tools and practices.
  • Information systems is the application of computing to support the operations of an organization: operating, installing, and maintaining the computers, software, and data. Management information systems is a key subfield that emphasizes financial and personnel management.

     

  • Computer engineering is about the analysis, design, and construction of computer hardware.
  • Information security is about the analysis and implementation of information system security (cryptography is included).

 

Major fields of importance for computer science

 

Mathematical foundations

 

Theoretical computer science

 

Hardware

(see also electrical engineering)

 

Computer systems organization

(see also electrical engineering)

 

Software

 

Data and information systems

 

Computing methodologies

 

Computer applications

 

Computing milieux

 

History

 

Prominent pioneers in computer science

See list of computer scientists for many more notables.

 

See also

 

 

 



:: Views: 6361 | Post rating: 48 | Number of raters: 15 | Total score: 15
Published: 5 Aban 1389 | Comments
Written by: student

Greetings to all the computer science and mathematics students!

*****"The site's chat room is opening soon........"******

Stay tuned for more good news...

It is worth mentioning that the chat room was suggested by one of our users,

so please share your own suggestions with us.

Site administration...



:: Views: 877 | Post rating: 47 | Number of raters: 16 | Total score: 16
Published: 26 Mehr 1389 | Comments
Written by: Amirhossein Harandi

A quantum computer is a machine that uses the phenomena and laws of quantum physics, such as superposition and entanglement, to perform its computations.
Quantum computers differ fundamentally from today's transistor-based computers. The core idea behind them is that the properties and laws of quantum physics can be used to store data and perform operations on it. A theoretical, abstract model of such machines is the Quantum Turing Machine, also called the Universal Quantum Computer.

Although quantum computing is still in its infancy, experiments have been performed in which quantum computational operations were executed on a very small number of qubits. Theoretical and practical research in this area continues, and many governmental and military institutions support research on quantum computers, both for civilian purposes and for security purposes (such as cryptanalysis).
If quantum computers can be built at a large scale, they will be able to solve certain problems very quickly (for example, using Shor's algorithm). Note, however, that functions that are not computable by classical computers are not computable by quantum computers either; these machines do not refute the Church-Turing thesis. Quantum computers simply offer us more speed.

Foundations of quantum computers
The memory of a classical computer is built from bits, where each bit is either 0 or 1. A quantum computer maintains a sequence of qubits (quantum bits). A single qubit can hold the value 0, the value 1, or any superposition of the two [that is, both states 0 and 1 at the same time, translator's note]. Furthermore, a pair of qubits can be in any superposition of 4 states, and 3 qubits in any superposition of 8 states. In general, a quantum computer with n qubits can be in any superposition of 2^n states simultaneously (unlike a classical computer, which at any moment is in exactly one of those states).
A quantum computer manipulates its qubits using quantum logic gates; a sequence of gates applied for this purpose is called a quantum algorithm.
One example of how qubits can be implemented in a quantum computer is to use particles with two spin states, up and down, usually denoted by up- and down-arrow symbols. In fact, any system with an observable quantity that does not change over time and has at least two distinct, consecutive eigenvalues is a suitable candidate for implementing a qubit, since such a system can be mapped onto a spin-1/2 system.
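The state-vector picture sketched above (n qubits, 2^n amplitudes) can be simulated directly on a classical computer for small n. The sketch below applies Hadamard gates to build an equal superposition; it is an illustrative classical simulation, not a quantum computation.

```python
from math import sqrt

def apply_hadamard(state, target, n):
    """Apply a Hadamard gate to qubit `target` of an n-qubit state vector.

    The state is a list of 2**n amplitudes; H maps |0> to (|0>+|1>)/sqrt(2)
    and |1> to (|0>-|1>)/sqrt(2) on the target qubit."""
    h = 1 / sqrt(2)
    new_state = state[:]
    for i in range(2 ** n):
        if not (i >> target) & 1:      # i has target bit 0; j is its partner
            j = i | (1 << target)
            a, b = state[i], state[j]
            new_state[i] = h * (a + b)
            new_state[j] = h * (a - b)
    return new_state

# Two qubits start in |00>; one Hadamard on each yields an equal
# superposition of all 2**2 = 4 basis states.
state = [1, 0, 0, 0]
state = apply_hadamard(state, 0, 2)
state = apply_hadamard(state, 1, 2)
print(state)  # all four amplitudes equal 0.5 (probability 0.25 each)
```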



:: Views: 934 | Post rating: 43 | Number of raters: 15 | Total score: 15
Published: 25 Mehr 1389 | Comments
Written by: Hossein Soltani

libguides.library.albany.edu/csci

This excellent link will help you with research on the field of computer science.



:: Views: 953 | Post rating: 48 | Number of raters: 15 | Total score: 15
Published: 25 Mehr 1389 | Comments
Written by: Hossein Soltani

Software Engineering
The term "software engineering" means one thing to the general public and quite another to computing professionals!
The public divides computing topics into two broad categories, hardware and software: whatever concerns a system's hardware is assigned to the hardware track, and whatever concerns computer software is assigned to the software track.
That is the definition of "software engineering" as the general public sees it (the broad sense).
But there is another definition of "software engineering", the one computer specialists use, and in fact the correct and rigorous one (the narrow sense).

The software engineering society of the IEEE defines software engineering as follows:
"The application of systematic, disciplined, and measurable methods to the development, operation, and maintenance of software, together with the study and development of those methods."

An example makes the meaning of this definition clearer.
Suppose you lead a team of 20 professional programmers, and a large government organization asks you to build a specialized and important piece of software for them, and you accept.
So today is the first day of work! What do you do? Do you start programming right away? If you start programming immediately, you will very likely fail! (Experienced programmers understand this well.) How do you divide the project among the team members; that is, how do you decide who does what? What methods can you use to find out exactly what software the organization wants from you? How do you design the software? (Designing software is different from programming it.) What do you keep in mind during design so that the software can easily be updated later (software maintenance)? And many other important questions.

The goal of software engineering (in effect, of engineering the software production process) is to direct your knowledge and activities toward producing high-quality software; that is, it lays out the stages of building and developing software.



:: Views: 996 | Post rating: 41 | Number of raters: 15 | Total score: 15
Published: 24 Mehr 1389 | Comments
Written by: student

Registered users should go to the following address to post their articles:

www.eecs.loxblog.ir\vorod

Note:

your user name is user@eecs (for example: ghamary@eecs)

Thanks, site administration

 



:: Views: 900 | Post rating: 65 | Number of raters: 19 | Total score: 19
Published: 14 Mehr 1389 | Comments
